
Former Google CEO Eric Schmidt rejects claims AI scaling has peaked – but firms like OpenAI, Anthropic, and Google are finding it harder and more expensive to deliver


How can the large language models (LLMs) driving the generative AI boom keep getting better? That’s the question driving a debate around so-called scaling laws — and former Google CEO Eric Schmidt isn’t concerned.

Scaling laws refer to how the accuracy and quality of a deep-learning model improve with size — bigger is better when it comes to the model itself, the amount of data it’s fed, and the computing power used to train it.
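These scaling laws are commonly modeled as power laws: loss falls as model size grows, but with diminishing returns. The sketch below is illustrative only — the constants are made up, not taken from any published scaling-law fit — but it shows the qualitative shape the debate is about: each additional 10x in parameters buys a smaller improvement than the last.

```python
def loss(n_params: float, a: float = 1.7e3, alpha: float = 0.076, b: float = 1.69) -> float:
    """Hypothetical power-law loss curve: loss(N) = a * N**(-alpha) + b.

    N is the parameter count; a, alpha, and b are illustrative constants
    (assumptions for this sketch, not real measured values). The additive
    term b models an irreducible loss floor that no amount of scale removes.
    """
    return a * n_params ** (-alpha) + b

# Evaluate at geometrically spaced model sizes: 100M to 100B parameters.
sizes = [1e8, 1e9, 1e10, 1e11]
losses = [loss(n) for n in sizes]

# Bigger models always do better...
assert all(l1 > l2 for l1, l2 in zip(losses, losses[1:]))

# ...but each 10x increase in size yields a smaller gain than the last.
gains = [l1 - l2 for l1, l2 in zip(losses, losses[1:])]
assert all(g1 > g2 for g1, g2 in zip(gains, gains[1:]))
```

The diminishing-returns property falls directly out of the power-law form: at geometric spacing, each successive gain is a constant fraction (10^-alpha) of the previous one — which is why continued scaling gets "harder and more expensive" even when it keeps working.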


